Riemannian Metric Learning for Symmetric Positive Definite Matrices

Authors

  • Raviteja Vemulapalli
  • David W. Jacobs
Abstract

Over the past few years, symmetric positive definite (SPD) matrices have been receiving considerable attention from the computer vision community. Though various distance measures have been proposed in the past for comparing SPD matrices, the two most widely used are the affine-invariant distance and the log-Euclidean distance, because both are true geodesic distances induced by Riemannian geometry. In this work, we focus on the log-Euclidean Riemannian geometry and propose a data-driven approach for learning Riemannian metrics/geodesic distances for SPD matrices. We show that the geodesic distance learned using the proposed approach performs better than various existing distance measures when evaluated on face matching and clustering tasks.

Notations

  • I denotes the identity matrix of appropriate size.
  • ⟨·, ·⟩ denotes an inner product.
  • S_n denotes the set of n × n symmetric matrices.
  • S_n^+ denotes the set of n × n symmetric positive definite matrices.
  • T_pM denotes the tangent space to the manifold M at the point p ∈ M.
  • ‖·‖_F denotes the matrix Frobenius norm.
  • Chol(P) denotes the lower triangular matrix obtained from the Cholesky decomposition of a matrix P.
  • exp(·) and log(·) denote the matrix exponential and logarithm, respectively.
  • ∂/∂x and ∂²/∂x² denote partial derivatives.
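For concreteness, the two baseline geodesic distances named in the abstract have well-known closed forms: d_LE(P, Q) = ‖log(P) − log(Q)‖_F and d_AI(P, Q) = ‖log(P^{−1/2} Q P^{−1/2})‖_F. The sketch below computes both with plain NumPy (the helper names are ours, not from the paper):

```python
import numpy as np

def spd_log(P):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(P)
    return V @ (np.log(w)[:, None] * V.T)

def spd_power(P, t):
    """Matrix power P^t of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(P)
    return V @ ((w ** t)[:, None] * V.T)

def log_euclidean_distance(P, Q):
    """d_LE(P, Q) = ||log(P) - log(Q)||_F."""
    return np.linalg.norm(spd_log(P) - spd_log(Q), ord="fro")

def affine_invariant_distance(P, Q):
    """d_AI(P, Q) = ||log(P^{-1/2} Q P^{-1/2})||_F."""
    P_nh = spd_power(P, -0.5)
    return np.linalg.norm(spd_log(P_nh @ Q @ P_nh), ord="fro")

# Random well-conditioned SPD matrices for demonstration.
rng = np.random.default_rng(0)
A, B = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
P = A @ A.T + 3.0 * np.eye(3)
Q = B @ B.T + 3.0 * np.eye(3)
```

Both distances are symmetric in their arguments, and d_AI is additionally invariant under congruence, i.e. d_AI(G P Gᵀ, G Q Gᵀ) = d_AI(P, Q) for any invertible G, which is the affine invariance the abstract refers to.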


Related Articles

Supervised LogEuclidean Metric Learning for Symmetric Positive Definite Matrices

Metric learning has been shown to be highly effective to improve the performance of nearest neighbor classification. In this paper, we address the problem of metric learning for symmetric positive definite (SPD) matrices such as covariance matrices, which arise in many real-world applications. Naively using standard Mahalanobis metric learning methods under the Euclidean geometry for SPD matric...
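A common concrete instantiation of this idea (a sketch under our own assumptions, not necessarily this paper's exact formulation) is to map each SPD matrix into the tangent space at the identity via the matrix logarithm, vectorize it so that the Euclidean norm matches the Frobenius norm, and then apply a learned Mahalanobis metric in that vector space:

```python
import numpy as np

def spd_log_vec(P):
    """Vectorize log(P) so that Euclidean distance between vectors equals the
    Frobenius distance between the matrix logarithms (off-diagonal entries are
    scaled by sqrt(2) to account for the symmetry of log(P))."""
    w, V = np.linalg.eigh(P)
    L = V @ (np.log(w)[:, None] * V.T)
    iu = np.triu_indices_from(L, k=1)
    return np.concatenate([np.diag(L), np.sqrt(2.0) * L[iu]])

def mahalanobis_distance(x, y, M):
    """sqrt((x - y)^T M (x - y)) for a positive semi-definite matrix M."""
    d = x - y
    return float(np.sqrt(d @ M @ d))

# With M = I this reduces exactly to the log-Euclidean distance;
# a metric-learning method would instead fit M to labeled pairs.
rng = np.random.default_rng(1)
A, B = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
P = A @ A.T + 3.0 * np.eye(3)
Q = B @ B.T + 3.0 * np.eye(3)
x, y = spd_log_vec(P), spd_log_vec(Q)
d_identity = mahalanobis_distance(x, y, np.eye(x.size))
```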


A Differential Geometric Approach to the Geometric Mean of Symmetric Positive-Definite Matrices

In this paper we introduce metric-based means for the space of positive-definite matrices. The mean associated with the Euclidean metric of the ambient space is the usual arithmetic mean. The mean associated with the Riemannian metric corresponds to the geometric mean. We discuss some invariance properties of the Riemannian mean and we use differential geometric tools to give a characterization...
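For two SPD matrices, the Riemannian mean described above has the well-known closed form P # Q = P^{1/2} (P^{−1/2} Q P^{−1/2})^{1/2} P^{1/2}, the midpoint of the affine-invariant geodesic; for commuting matrices it reduces to (PQ)^{1/2}. A minimal sketch (helper names ours):

```python
import numpy as np

def spd_power(P, t):
    """Matrix power P^t of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(P)
    return V @ ((w ** t)[:, None] * V.T)

def geometric_mean(P, Q):
    """Midpoint of the affine-invariant geodesic:
    P^{1/2} (P^{-1/2} Q P^{-1/2})^{1/2} P^{1/2}."""
    Ph, Pnh = spd_power(P, 0.5), spd_power(P, -0.5)
    return Ph @ spd_power(Pnh @ Q @ Pnh, 0.5) @ Ph

# Random SPD matrices for demonstration.
rng = np.random.default_rng(2)
A, B = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
P = A @ A.T + 3.0 * np.eye(3)
Q = B @ B.T + 3.0 * np.eye(3)
G = geometric_mean(P, Q)
```

Despite the asymmetric-looking formula, the result is symmetric in P and Q, and it satisfies det(P # Q) = √(det P · det Q), two of the invariance properties the paper discusses.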


Approximating Sets of Symmetric and Positive-Definite Matrices by Geodesics

We formulate a generalized version of the classical linear regression problem on Riemannian manifolds and derive the counterpart to the normal equations for the manifold of symmetric and positive definite matrices, equipped with the only metric that is invariant under the natural action of the general linear group.


Learning Discriminative αβ-Divergences for Positive Definite Matrices

Symmetric positive definite (SPD) matrices are useful for capturing second-order statistics of visual data. To compare two SPD matrices, several measures are available, such as the affine-invariant Riemannian metric, Jeffreys divergence, Jensen-Bregman logdet divergence, etc.; however, their behaviors may be application dependent, raising the need of manual selection to achieve the best possibl...



Journal:
  • CoRR

Volume: abs/1501.02393

Pages: -

Publication date: 2015